Selecting Likelihood Weights by Cross-Validation

Authors

  • James V. Zidek
Abstract

The (relevance) weighted likelihood was introduced to formally embrace a variety of statistical procedures that trade bias for precision. Unlike its classical counterpart, the weighted likelihood combines all relevant information while inheriting many of its desirable features including good asymptotic properties. However, in order to be effective, the weights involved in its construction need to be judiciously chosen. Choosing those weights is the subject of this article in which we demonstrate the use of cross-validation. We prove the resulting weighted likelihood estimator (WLE) to be weakly consistent and asymptotically normal. An application to disease mapping data is demonstrated.
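
As a rough illustration of the idea (a toy sketch, not the paper's WLE): for a normal mean with known variance, a single relevance weight trades a small study sample off against a larger, possibly biased related sample, and that weight is chosen by leave-one-out cross-validated predictive log-likelihood. All data and the weight grid below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def wle_mean(x, y, lam):
    """Weighted-likelihood estimate of the study-population mean.

    For a normal likelihood with known variance, maximizing
    sum_i log f(x_i; theta) + lam * sum_j log f(y_j; theta)
    yields a weighted average of the two sample totals.
    """
    n, m = len(x), len(y)
    return (x.sum() + lam * y.sum()) / (n + lam * m)

def cv_score(x, y, lam):
    """Leave-one-out cross-validated predictive log-likelihood on the
    study sample x, used as the criterion for choosing the weight."""
    score = 0.0
    for i in range(len(x)):
        x_train = np.delete(x, i)
        theta = wle_mean(x_train, y, lam)
        score += norm.logpdf(x[i], loc=theta, scale=1.0)  # assumed unit variance
    return score

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=20)    # small study sample
y = rng.normal(0.3, 1.0, size=200)   # larger, slightly biased related sample

grid = np.linspace(0.0, 1.0, 21)
best_lam = max(grid, key=lambda lam: cv_score(x, y, lam))
print("chosen weight:", best_lam, "WLE:", wle_mean(x, y, best_lam))
```

With no bias in the related sample the cross-validation criterion tends to pick a large weight (more precision); as the bias grows it pushes the weight toward zero, which is exactly the bias-precision trade the abstract describes.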


Related articles

A Bayesian Nominal Regression Model with Random Effects for Analysing Tehran Labor Force Survey Data

Large survey data are often accompanied by sampling weights that reflect the unequal probabilities of selecting subjects under a complex sampling design. Sampling weights act as expansion factors that, by scaling the subjects, make the sample representative of the population. The quasi-maximum likelihood method is one of the approaches for incorporating sampling weights in the frequentist framewo...
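
A minimal sketch of how sampling weights enter a quasi- (pseudo-) likelihood, here for a simple Bernoulli proportion; the toy outcomes and inclusion probabilities below are assumptions, not from the paper.

```python
import numpy as np

# Assumed toy survey: binary outcome y with inclusion probabilities pi,
# so each subject carries a design weight w = 1/pi (its expansion factor).
y  = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
pi = np.array([0.8, 0.8, 0.2, 0.2, 0.5, 0.5, 0.5, 0.8, 0.2, 0.5])
w  = 1.0 / pi

# Quasi-maximum likelihood for a proportion: maximize
# sum_i w_i * [y_i log p + (1 - y_i) log(1 - p)],
# whose maximizer is the design-weighted mean.
p_hat = np.sum(w * y) / np.sum(w)
print("unweighted estimate:", y.mean(), " design-weighted estimate:", p_hat)
```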


Asymptotic optimality of likelihood-based cross-validation.

Likelihood-based cross-validation is a statistical tool for selecting a density estimate based on n i.i.d. observations from the true density among a collection of candidate density estimators. General examples are the selection of a model indexing a maximum likelihood estimator, and the selection of a bandwidth indexing a nonparametric (e.g. kernel) density estimator. In this article, we estab...
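
The kernel-bandwidth example mentioned above is easy to make concrete. A hedged sketch: the bandwidth of a Gaussian kernel density estimate is chosen by maximizing the leave-one-out log-likelihood; the data, kernel, and grid here are illustrative assumptions, not from the paper.

```python
import numpy as np

def loo_loglik(x, h):
    """Leave-one-out log-likelihood of a Gaussian kernel density
    estimate with bandwidth h: each point is scored by the density
    built from all the other points."""
    n = len(x)
    diffs = (x[:, None] - x[None, :]) / h           # pairwise scaled differences
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    np.fill_diagonal(k, 0.0)                        # drop the held-out point
    dens = k.sum(axis=1) / ((n - 1) * h)
    return np.log(dens).sum()

rng = np.random.default_rng(1)
x = rng.normal(size=200)

bandwidths = np.linspace(0.05, 1.0, 40)
best_h = max(bandwidths, key=lambda h: loo_loglik(x, h))
print("bandwidth chosen by likelihood cross-validation:", best_h)
```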


Statistical models: Conventional, penalized and hierarchical likelihood

We give an overview of statistical models and likelihood, together with two of its variants: penalized and hierarchical likelihood. The Kullback-Leibler divergence is referred to repeatedly in the literature, for defining the misspecification risk of a model and for grounding the likelihood and the likelihood cross-validation, which can be used for choosing weights in penalized likelihood. Fami...
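
To illustrate choosing a weight in a penalized likelihood by likelihood cross-validation (an illustrative sketch, not the paper's formulation): ridge-penalized Gaussian regression, with the penalty weight selected by k-fold cross-validated predictive log-likelihood. The simulated data and grid are assumptions.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Maximize the penalized Gaussian log-likelihood
    -0.5*||y - X b||^2 - 0.5*lam*||b||^2 (closed form)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def cv_loglik(X, y, lam, k=5, seed=0):
    """k-fold cross-validated predictive log-likelihood (unit error
    variance assumed), used to pick the penalty weight lam."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    score = 0.0
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        b = ridge_fit(X[train], y[train], lam)
        resid = y[fold] - X[fold] @ b
        score += -0.5 * np.sum(resid**2)   # log-likelihood up to constants
    return score

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))
beta = np.zeros(10)
beta[:3] = [2.0, -1.0, 0.5]
y = X @ beta + rng.normal(size=100)

grid = np.logspace(-2, 2, 25)
best_lam = max(grid, key=lambda lam: cv_loglik(X, y, lam))
print("penalty weight chosen by likelihood cross-validation:", best_lam)
```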


Cross Efficiency Evaluation with Negative Data in Selecting the Best of Portfolio Using OWA Operator Weights

The present study is an attempt toward evaluating the performance of portfolios and asset selection using cross-efficiency evaluation. Cross-efficiency evaluation is an effective way of ranking decision-making units (DMUs) in data envelopment analysis (DEA). Conventional DEA models assume nonnegative values for inputs and outputs. However, we know that unlike return and skewness, variance is the on...


Gaussian mixture optimization for HMM based on efficient cross-validation

A Gaussian mixture optimization method is explored using cross-validation likelihood as an objective function instead of the conventional training-set likelihood. The optimization is based on reducing the number of mixture components by selecting and merging a pair of Gaussians step by step based on the objective function, so as to remove redundant components and improve the generality of the mod...
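
A hedged 1-D sketch of the merge step (toy parameters; a single held-out set stands in for the cross-validation folds, and moment matching is one common merge rule, not necessarily the paper's): candidate pairs of components are merged and a merge is kept only if the held-out log-likelihood does not drop.

```python
import numpy as np
from scipy.stats import norm

def mix_loglik(x, w, mu, sd):
    """Log-likelihood of data x under a 1-D Gaussian mixture."""
    dens = sum(wi * norm.pdf(x, mi, si) for wi, mi, si in zip(w, mu, sd))
    return np.log(dens).sum()

def merge(w, mu, sd, i, j):
    """Moment-preserving merge of components i and j into one Gaussian."""
    wi, wj = w[i], w[j]
    wm = wi + wj
    mm = (wi * mu[i] + wj * mu[j]) / wm
    vm = (wi * (sd[i]**2 + mu[i]**2) + wj * (sd[j]**2 + mu[j]**2)) / wm - mm**2
    keep = [k for k in range(len(w)) if k not in (i, j)]
    return (np.append(w[keep], wm),
            np.append(mu[keep], mm),
            np.append(sd[keep], np.sqrt(vm)))

# Assumed toy fit with a redundant pair of components.
rng = np.random.default_rng(3)
x_heldout = rng.normal(0, 1, 300)          # stand-in for a cross-validation fold
w  = np.array([0.4, 0.35, 0.25])
mu = np.array([-0.1, 0.1, 2.5])
sd = np.array([1.0, 1.0, 0.7])

# Pick the pair whose merge costs the least held-out likelihood,
# and accept it only if the criterion does not decrease.
base = mix_loglik(x_heldout, w, mu, sd)
pairs = [(i, j) for i in range(3) for j in range(i + 1, 3)]
best = max(pairs, key=lambda p: mix_loglik(x_heldout, *merge(w, mu, sd, *p)))
if mix_loglik(x_heldout, *merge(w, mu, sd, *best)) >= base:
    print("merge components", best)
else:
    print("keep all components")
```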



Journal:

Volume   Issue

Pages  -

Publication date: 2008